# Multilingual Instructions

**Typhoon2.1 Gemma3 4b** (scb10x) · Large Language Model · Safetensors · 2,083 downloads · 3 likes
Thai large language model (instruction-tuned version) with 4 billion parameters, a 128K context length, and function calling capability.
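
The Typhoon entry above advertises function calling. The sketch below shows one hedged way to exercise that through the `tools` argument of the Hugging Face transformers chat template; the repository id `scb10x/typhoon2.1-gemma3-4b`, the weather tool schema, and the prompt are illustrative assumptions, and the exact tool-call format the model emits depends on its chat template.

```python
# A hedged sketch of function calling with Typhoon2.1 Gemma3 4b through the
# Hugging Face transformers chat template. The repo id and the weather tool
# schema are illustrative assumptions; consult the model card for the exact
# tool-call format the template expects and emits.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "scb10x/typhoon2.1-gemma3-4b"  # assumed repository id

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID, torch_dtype="auto", device_map="auto"
)

# Hypothetical tool definition in the JSON-schema style accepted by
# apply_chat_template's `tools` argument in recent transformers releases.
weather_tool = {
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get the current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {
                "city": {"type": "string", "description": "City name"},
            },
            "required": ["city"],
        },
    },
}

messages = [{"role": "user", "content": "What is the weather in Bangkok right now?"}]

prompt = tokenizer.apply_chat_template(
    messages,
    tools=[weather_tool],
    add_generation_prompt=True,
    tokenize=False,
)
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=256)
# The response should contain a structured tool call if the model chooses to
# use the tool; otherwise it answers directly.
print(tokenizer.decode(output[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True))
```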

**Qwen3 14B GPTQ Int4** (JunHowie) · Apache-2.0 · Large Language Model · Transformers · 640 downloads · 2 likes
GPTQ Int4 quantized build of Qwen3-14B from the latest generation of the Qwen series, supporting switching between thinking and non-thinking modes, with strong performance in reasoning, multilingual, and agent tasks.

**Qwen3 1.7B FP8** (Qwen) · Apache-2.0 · Large Language Model · Transformers · 5,645 downloads · 26 likes
Qwen3-1.7B-FP8 is the FP8 version of the latest generation of the Qwen series of large language models, with strong reasoning, instruction-following, agent interaction, and multilingual support capabilities.

**Qwen3 4B** (Qwen) · Apache-2.0 · Large Language Model · Transformers · 307.26k downloads · 195 likes
Qwen3-4B is the latest 4-billion-parameter model in the Qwen series of large language models, supporting switching between thinking and non-thinking modes, with strong reasoning, instruction-following, and multilingual capabilities.
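
Several Qwen3 entries mention switching between thinking and non-thinking modes. Below is a minimal sketch of that toggle, assuming the `enable_thinking` chat-template flag documented on the Qwen3 model cards and the `Qwen/Qwen3-4B` repository; the prompt and generation settings are illustrative only.

```python
# Hedged sketch: toggling Qwen3's thinking mode via the chat template's
# enable_thinking flag, following the pattern documented for Qwen3 models.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "Qwen/Qwen3-4B"

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID, torch_dtype="auto", device_map="auto"
)

messages = [{"role": "user", "content": "Explain the Pythagorean theorem briefly."}]

# enable_thinking=True lets the model emit a <think>...</think> reasoning block
# before its answer; set it to False for direct, non-thinking responses.
prompt = tokenizer.apply_chat_template(
    messages,
    tokenize=False,
    add_generation_prompt=True,
    enable_thinking=True,
)
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=512)
print(tokenizer.decode(output[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True))
```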

**Qwen3 1.7B** (Qwen) · Apache-2.0 · Large Language Model · Transformers · 395.72k downloads · 113 likes
Qwen3 is the latest generation of large language models in the Tongyi Qianwen series, offering a complete set of dense and Mixture of Experts (MoE) models. Based on large-scale training, Qwen3 has achieved breakthroughs in reasoning, instruction following, agent capabilities, and multilingual support.

**Yulan Mini Instruct** (yulan-team) · MIT · Large Language Model · Transformers · Supports Multiple Languages · 97 downloads · 2 likes
YuLan-Mini-Instruct is a compact yet powerful 2.4-billion-parameter text generation model, specializing in mathematical and code reasoning tasks with support for both English and Chinese.